Data Engineer with Databricks (Brussels, Belgium)
6-month contract with possible extension
Hybrid (3 days per week from the office)
8+ years of experience
Design, develop, and optimize data workflows and notebooks using Databricks to ingest, transform, and load data from various sources into the data lake (a minimal sketch follows this list).
Build and maintain scalable, efficient data processing workflows using Spark (PySpark or Spark SQL), following coding standards and best practices.
Collaborate with technical and business stakeholders to understand data requirements and translate them into technical solutions.
Develop data models and schemas to support reporting and analytics needs.
Ensure data quality, integrity, and security by implementing appropriate checks and controls.
Monitor and optimize data processing performance, identifying and resolving bottlenecks.
Stay up to date with the latest advancements in data engineering and Databricks technologies.
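
For illustration only (not part of the role description): a minimal PySpark sketch of the ingest-transform-load pattern referenced above, assuming hypothetical source paths and table names.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Assumes a Databricks-style environment with a Spark session and Delta Lake support available.
spark = SparkSession.builder.appName("orders_ingest").getOrCreate()

# Ingest: read raw CSV files from a landing zone (path is a placeholder)
raw = (
    spark.read.option("header", "true")
    .csv("s3://example-landing-zone/orders/")
)

# Transform: cast types, drop duplicates, and apply a basic quality check
orders = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("amount", F.col("amount").cast("double"))
    .dropDuplicates(["order_id"])
    .filter(F.col("order_id").isNotNull())
)

# Load: append into a Delta table in the data lake for downstream reporting
orders.write.format("delta").mode("append").saveAsTable("analytics.orders")
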
Experience with ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) processes
Experience with the AWS cloud platform
AWS administrator experience and certification
Experience with Python or SQL
Experience with Delta Lake (see the sketch after this list)
Experience with Dataiku
Understanding of DevOps principles and practices
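
For illustration only: a minimal Delta Lake upsert (MERGE) sketch in the ELT style mentioned above, assuming illustrative table and column names.

from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.appName("customers_upsert").getOrCreate()

# New or changed records staged by an upstream load step (path is a placeholder)
updates = spark.read.format("parquet").load("s3://example-staging/customers/")

# Existing Delta table that serves reporting and analytics
target = DeltaTable.forName(spark, "analytics.customers")

# Upsert: update matching rows, insert new ones, keyed on customer_id
(
    target.alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
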